Classification with Invariant Distance Substitution Kernels

Authors

  • Bernard Haasdonk
  • Hans Burkhardt
Abstract

Kernel methods offer a flexible toolbox for pattern analysis and machine learning. Invariant distance substitution (IDS) kernels are a general class of kernel functions that incorporate known pattern invariances. Instances such as tangent-distance or dynamic time-warping kernels have demonstrated their real-world applicability, which motivates investigating the elementary properties of general IDS kernels. In this paper we formally state and demonstrate their invariance properties, in particular the adjustability of the invariance in two conceptually different ways. We characterize the definiteness of the kernels and apply them in different classification methods, demonstrating various benefits of invariance.
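
The construction behind IDS kernels can be illustrated with a small amount of code. The sketch below is not the authors' implementation; it assumes a toy setting in which patterns are 1-D arrays, the invariance group is approximated by a finite set of cyclic shifts, and the base distance is Euclidean, with the invariant distance then substituted into an RBF-type kernel. The names `invariant_distance`, `ids_rbf_kernel`, `max_shift`, and `gamma` are illustrative, not taken from the paper.

```python
# Minimal sketch of an invariant distance substitution (IDS) kernel.
# Assumptions (not from the paper): patterns are 1-D arrays, the invariance
# group is approximated by a small finite set of cyclic shifts, and the
# base distance is Euclidean.  Tangent-distance or DTW variants would follow
# the same substitution pattern with a different distance.
import numpy as np

def base_distance(x, y):
    """Ordinary Euclidean distance."""
    return np.linalg.norm(x - y)

def transformations(x, max_shift=2):
    """A small, finite approximation of the invariance group:
    cyclic shifts of the pattern by up to +/- max_shift positions."""
    return [np.roll(x, s) for s in range(-max_shift, max_shift + 1)]

def invariant_distance(x, y, max_shift=2):
    """Two-sided invariant distance: minimize the base distance over
    transformed versions of both arguments.  Shrinking max_shift towards 0
    recovers the plain Euclidean distance, i.e. removes the invariance."""
    return min(base_distance(xt, yt)
               for xt in transformations(x, max_shift)
               for yt in transformations(y, max_shift))

def ids_rbf_kernel(X, Y, gamma=0.5, max_shift=2):
    """Distance substitution: plug the invariant distance into an
    RBF-type kernel k(x, y) = exp(-gamma * d(x, y)**2).
    For general distances this Gram matrix need not be positive definite,
    which is why the definiteness of IDS kernels is studied."""
    K = np.zeros((len(X), len(Y)))
    for i, x in enumerate(X):
        for j, y in enumerate(Y):
            K[i, j] = np.exp(-gamma * invariant_distance(x, y, max_shift) ** 2)
    return K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 16))
    K = ids_rbf_kernel(X, X)
    print(np.round(K, 3))
    # A shifted copy of a pattern is treated as (nearly) identical.
    print("k(x, roll(x,1)) =", ids_rbf_kernel([X[0]], [np.roll(X[0], 1)])[0, 0])
```

Such a precomputed Gram matrix can be fed to any kernel method that accepts precomputed kernels, for example scikit-learn's SVC(kernel='precomputed'); in this toy setting, varying `max_shift` is one simple way to adjust how much invariance the kernel builds in.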


Similar articles

Learning with Distance Substitution Kernels

During recent years much effort has been spent on incorporating problem-specific a-priori knowledge into kernel methods for machine learning. A common example is a-priori knowledge given by a distance measure between objects. A simple but effective approach to kernel construction consists of substituting the Euclidean distance in ordinary kernel functions by the problem-specific distance measure... (a minimal sketch of this construction follows this entry)

Full text
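
Since the entry above is closely related to this paper (whose abstract names dynamic time-warping kernels as an instance of IDS kernels), a second small sketch may help: a textbook dynamic time-warping (DTW) distance substituted for the Euclidean distance in an RBF-type kernel. This is an illustrative example, not code from either paper; `dtw_distance` and `distance_substitution_kernel` are hypothetical helper names.

```python
# Minimal sketch of plain distance substitution, assuming a dynamic
# time-warping (DTW) distance as the problem-specific distance measure.
# The DTW implementation is the standard O(n*m) recursion.
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def distance_substitution_kernel(A, B, gamma=0.1):
    """Substitute DTW for the Euclidean distance in an RBF-type kernel."""
    return np.array([[np.exp(-gamma * dtw_distance(a, b) ** 2) for b in B]
                     for a in A])

if __name__ == "__main__":
    s1 = np.sin(np.linspace(0, 2 * np.pi, 30))
    s2 = np.sin(np.linspace(0, 2 * np.pi, 40))   # same shape, different length
    s3 = np.cos(np.linspace(0, 2 * np.pi, 30))
    print(distance_substitution_kernel([s1], [s2, s3]))
```

Because DTW is not a metric in general, the resulting Gram matrix is not guaranteed to be positive (semi-)definite; this is precisely the definiteness question addressed for IDS kernels above.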

Transformation knowledge in pattern analysis with kernel methods: distance and integration kernels

Among the modern techniques for data analysis and machine learning are the so-called kernel methods. The most famous and successful one is the support vector machine (SVM) for classification or regression tasks. Further examples are kernel principal component analysis for feature extraction and other linear classifiers such as the kernel perceptron. The fundamental ingredient in these methods is t...

Full text

D2KE: From Distance to Kernel and Embedding

For many machine learning problem settings, particularly with structured inputs such as sequences or sets of objects, a distance measure between inputs can be specified more naturally than a feature representation. However, most standard machine learning models are designed for inputs with a vector feature representation. In this work, we consider the estimation of a function f : X → R based solely on a...

Full text

Translation-invariant bilinear operators with positive kernels

We study L^r (or L^{r,∞}) boundedness for bilinear translation-invariant operators with nonnegative kernels acting on functions on R. We prove that if such operators are bounded on some products of Lebesgue spaces, then their kernels must necessarily be integrable functions on R, while via a counterexample we show that the converse statement is not valid. We provide certain necessary and some suffic...

Full text

Kernel Methods on Approximate Infinite-Dimensional Covariance Operators for Image Classification

This paper presents a novel framework for visual object recognition using infinite-dimensional covariance operators of input features in the paradigm of kernel methods on infinite-dimensional Riemannian manifolds. Our formulation provides in particular a rich representation of image features by exploiting their non-linear correlations. Theoretically, we provide a finite-dimensional approximatio...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2007